
    Properties of dust in the detached shells around U Ant, DR Ser, and V644 Sco

    Understanding the properties of dust produced during the asymptotic giant branch (AGB) phase of stellar evolution is important for understanding the evolution of stars and galaxies. Recent observations of the carbon AGB star R Scl have shown that observations at far-infrared and submillimetre wavelengths can effectively constrain the grain sizes in the shell, while the total mass depends on the structure of the grains (solid vs. hollow or fluffy). We aim to constrain the properties of the dust observed in the submillimetre in the detached shells around the three carbon AGB stars U Ant, DR Ser, and V644 Sco, and to investigate the constraints on dust masses and grain sizes provided by far-infrared and submillimetre observations. We observed U Ant, DR Ser, and V644 Sco at 870 micron using LABOCA on APEX. Combined with observations from the optical to the far-infrared, we produced dust radiative transfer models of the spectral energy distributions (SEDs) with contributions from the stars, present-day mass loss, and detached shells. We tested the effect of different total dust masses and grain sizes on the SED, and attempted to consistently reproduce the SEDs from the optical to the submillimetre. We derive dust masses in the shells of a few 10^-5 Msun, assuming spherical, solid grains. The best-fit grain radii are comparatively large, indicating the presence of grains with radii between 0.1 and 2 micron. The LABOCA observations suffer from contamination by 12CO(3-2) and hence give fluxes that are higher than the predicted dust emission at submillimetre wavelengths. We investigate the effect on the best-fitting models of assuming different degrees of contamination, and show that far-infrared and submillimetre observations are important for constraining the dust mass and grain sizes in the shells. Comment: Accepted by A&
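    Dust masses of the kind quoted above are commonly derived from the optically thin emission relation M_dust = F_nu d^2 / (kappa_nu B_nu(T_dust)). A minimal sketch of that calculation follows; the opacity, temperature, distance, and flux values are illustrative assumptions, not the paper's numbers:

```python
import numpy as np

# Physical constants (SI)
h, c, k_B = 6.626e-34, 2.998e8, 1.381e-23
M_SUN, PC = 1.989e30, 3.086e16

def planck(nu, T):
    """Planck function B_nu(T) in W m^-2 Hz^-1 sr^-1."""
    return 2.0 * h * nu**3 / c**2 / np.expm1(h * nu / (k_B * T))

def dust_mass(flux_jy, dist_pc, T_dust, kappa_cm2_g, wavelength_m=870e-6):
    """Optically thin dust mass M = F_nu d^2 / (kappa_nu B_nu(T)), in Msun."""
    nu = c / wavelength_m
    F = flux_jy * 1e-26              # Jy -> W m^-2 Hz^-1
    d = dist_pc * PC                 # pc -> m
    kappa = kappa_cm2_g * 0.1        # cm^2 g^-1 -> m^2 kg^-1
    return F * d**2 / (kappa * planck(nu, T_dust)) / M_SUN

# Illustrative input values only (assumed, not taken from the paper):
m = dust_mass(flux_jy=0.05, dist_pc=300, T_dust=50.0, kappa_cm2_g=3.0)
```

    The strong dependence on the assumed grain opacity kappa_nu is exactly why the grain structure (solid vs. hollow or fluffy) matters for the total mass.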

    Comparison between mirror Langmuir probe and gas puff imaging measurements of intermittent fluctuations in the Alcator C-Mod scrape-off layer

    Statistical properties of scrape-off layer (SOL) plasma fluctuations are studied in ohmically heated plasmas in the Alcator C-Mod tokamak. For the first time, plasma fluctuations, as well as the parameters that describe them, are compared across measurements from a mirror Langmuir probe (MLP) and from gas-puff imaging (GPI) that sample the same plasma discharge. This comparison is complemented by an analysis of line-emission time-series data synthesized from the MLP electron density and temperature measurements. The fluctuations observed by the MLP and GPI typically display relative fluctuation amplitudes of order unity, together with positively skewed and flattened probability density functions. Such time series are well described by an established stochastic framework, which models the data as a superposition of uncorrelated, two-sided exponential pulses. The most important parameter of the process is the intermittency parameter γ = τ_d/τ_w, where τ_d denotes the duration time of a single pulse and τ_w the average waiting time between consecutive pulses. Here we show, using a new deconvolution method, that these parameters can be consistently estimated from different statistics of the data. We also show that the statistical properties of the data sampled by the MLP and GPI diagnostics are very similar. Finally, a comparison of the GPI signal to the synthetic line-emission time series suggests that the measured emission intensity cannot be explained solely by a simplified model that neglects neutral-particle dynamics.
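    The stochastic framework referenced here can be simulated directly: pulses arrive as a Poisson process with mean waiting time τ_w and are summed with a two-sided exponential shape of duration τ_d. A minimal sketch, assuming exponentially distributed amplitudes and waiting times (a common modelling choice, not necessarily the paper's exact setup):

```python
import numpy as np

rng = np.random.default_rng(42)

def synthetic_signal(gamma, tau_d=1.0, n_pulses=500, dt=0.05, asym=0.5):
    """Superposition of uncorrelated two-sided exponential pulses.

    gamma = tau_d / tau_w is the intermittency parameter; small gamma
    means rare, isolated bursts, large gamma strongly overlapping pulses.
    """
    tau_w = tau_d / gamma
    arrivals = np.cumsum(rng.exponential(tau_w, n_pulses))
    amplitudes = rng.exponential(1.0, n_pulses)
    t = np.arange(0.0, arrivals[-1], dt)
    signal = np.zeros_like(t)
    rise, fall = asym * tau_d, (1.0 - asym) * tau_d
    for t0, a in zip(arrivals, amplitudes):
        rel = t - t0
        tau = np.where(rel < 0.0, rise, fall)       # rise before, decay after
        signal += a * np.exp(-np.abs(rel) / tau)
    return t, signal

t, s = synthetic_signal(gamma=0.5)
```

    For exponentially distributed amplitudes this process has a Gamma-distributed amplitude PDF with shape parameter γ, which is what makes the skewness and flatness of the measured signals informative about the intermittency.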

    Stable annual scheduling of medical residents using prioritized multiple training schedules to combat operational uncertainty

    For educational purposes, medical residents often have to pass through many departments, which place different requirements on them. They are informed about the upcoming departments by an annual training schedule, which keeps the individual departments' service level as constant as possible. Due to poor planning and uncertain events, deviations from the schedule can occur. These deviations affect the service level in the departments, as well as the training progress and satisfaction of the residents. This article analyzes the impact of priorities in residents' annual planning, based on department assignments, to combat uncertainty that might result in departmental changes. We present a novel two-stage formulation that combines the tactical planning of residents with the operational level of duty and daily scheduling. We determine an analytical bound for the problem that is superior to the LP bound. Additionally, we approximate a bound based on the solution approach, using the objective value of the deterministic solution of an instance and the absences in each scenario. In a computational study, we analyze the performance of the various bounds, our solution approach, and the effects of additional priorities in residents' annual planning. We show that additional priorities can significantly reduce the number of unexpected department assignments. Finally, we derive a practical number of priorities from the results.
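    The recourse idea behind prioritized schedules can be illustrated with a toy model: when a planned assignment becomes infeasible in a scenario, the resident falls back to the highest-ranked alternative department that is still available. The data model and names below are hypothetical, not the paper's formulation:

```python
def realized_assignments(planned, priorities, unavailable):
    """Second-stage recourse driven by first-stage priorities.

    planned:     {resident: planned department}
    priorities:  {resident: ordered list of fallback departments}
    unavailable: set of departments closed in this scenario
    """
    realized = {}
    for resident, dept in planned.items():
        if dept not in unavailable:
            realized[resident] = dept          # plan survives unchanged
        else:
            # take the first prioritized fallback that is still open
            realized[resident] = next(
                (p for p in priorities.get(resident, []) if p not in unavailable),
                None)
    return realized
```

    In this picture, "additional priorities" simply lengthen each fallback list, which is why they reduce the number of unexpected (unranked) department assignments.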

    Analyzing the accuracy of variable returns to scale data envelopment analysis models

    The data envelopment analysis (DEA) model is extensively used to estimate efficiency, but no study has determined which DEA model delivers the most precise estimates. To address this issue, we advance the Monte Carlo simulation-based data-generation process proposed by Kohl and Brunner (2020). The developed process generates an artificial dataset using the Translog production function (instead of the commonly used Cobb-Douglas) to construct well-behaved scenarios under variable returns to scale (VRS). Using different VRS DEA models, we compute DEA efficiency scores for the artificially generated decision-making units (DMUs). We employ five performance indicators, together with a benchmark value, a ranking, and statistical hypothesis tests, to evaluate the quality of the efficiency estimates. The procedure allows us to determine which parameters negatively or positively influence the quality of the DEA estimates. It also enables us to identify which DEA model performs best over a wide range of scenarios. In contrast to the widely applied BCC (Banker-Charnes-Cooper) model, we find that the Assurance Region (AR) and Slacks-Based Measurement (SBM) DEA models perform better. Thus, we endorse the use of the AR and SBM models for DEA applications under the VRS regime.
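    The core of such a data-generating process is placing DMUs on a known Translog frontier, ln y = a0 + Σ_i a_i ln x_i + ½ Σ_ij b_ij ln x_i ln x_j, and then shifting them below it by a known inefficiency, so that the DEA estimates can be compared against the truth. A sketch with illustrative coefficients (the actual parameterization is the one in Kohl and Brunner's procedure, not reproduced here):

```python
import numpy as np

rng = np.random.default_rng(1)

def translog_dmus(n_dmu=100, alpha0=0.5, alpha=(0.3, 0.4),
                  beta=((0.05, -0.02), (-0.02, 0.04))):
    """Generate DMUs on a Translog frontier, then apply inefficiency.

    Coefficient values are illustrative assumptions; a well-behaved study
    would check monotonicity and curvature over the sampled input range.
    """
    alpha, beta = np.asarray(alpha), np.asarray(beta)
    x = rng.uniform(1.0, 10.0, size=(n_dmu, len(alpha)))   # inputs
    lx = np.log(x)
    # Frontier output: ln y = a0 + a.lx + 0.5 * lx' B lx (per DMU)
    ln_y = alpha0 + lx @ alpha + 0.5 * np.einsum('ni,ij,nj->n', lx, beta, lx)
    u = np.abs(rng.normal(0.0, 0.2, n_dmu))   # half-normal inefficiency
    y_obs = np.exp(ln_y - u)                  # observed output <= frontier
    true_efficiency = np.exp(-u)
    return x, y_obs, true_efficiency
```

    The true efficiencies returned here are exactly the benchmark against which the five performance indicators score each DEA variant.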

    Galaxy types in the Sloan Digital Sky Survey using supervised artificial neural networks

    Supervised artificial neural networks are used to predict useful properties of galaxies in the Sloan Digital Sky Survey, in this instance morphological classifications, spectral types, and redshifts. When the trained networks are applied to unseen data, the correlations between predicted and actual properties are around 0.9, with rms errors of order ten per cent. Thus, given a representative training set, these properties may be reliably estimated for galaxies in the survey for which there are no spectra, and without human intervention.
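    The two quoted performance figures correspond to the Pearson correlation and the root-mean-square error between network predictions and known catalogue values; a sketch of that evaluation (function name is mine):

```python
import numpy as np

def correlation_and_rms(predicted, actual):
    """Pearson correlation and rms error between the network's predictions
    and the known values (e.g. spectroscopic redshifts) on unseen data."""
    predicted = np.asarray(predicted, dtype=float)
    actual = np.asarray(actual, dtype=float)
    r = np.corrcoef(predicted, actual)[0, 1]
    rms = np.sqrt(np.mean((predicted - actual) ** 2))
    return r, rms
```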

    Homogeneity and best practice analyses in hospital performance management: an analytical framework

    Performance modeling of hospitals using data envelopment analysis (DEA) has received steadily increasing attention in the literature. In the traditional DEA framework, hospitals are generally assumed to be functionally similar and therefore homogeneous; accordingly, any identified inefficiency is attributed to the inefficient use of inputs to produce outputs. However, disparities in DEA efficiency scores may instead result from the inherent heterogeneity of hospitals. Additionally, traditional DEA models lack predictive capabilities, despite being frequently used as a benchmarking tool in the literature. To address these concerns, this study proposes a framework for analyzing hospital performance by combining two complementary modeling approaches. Specifically, we employ a self-organizing-map artificial neural network (SOM-ANN) to conduct a cluster analysis and a multilayer-perceptron ANN (MLP-ANN) to perform a heterogeneity analysis and a best-practice analysis. The applicability of the integrated framework is demonstrated empirically on a large dataset containing more than 1,100 hospitals in Germany. The framework enables a decision-maker not only to predict the best performance but also to explore whether differences in relative efficiency scores are ascribable to the heterogeneity of hospitals.
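    The clustering step rests on the standard SOM update rule: each sample pulls its best-matching unit (BMU) and that unit's grid neighbours toward itself, with learning rate and neighbourhood radius shrinking over time. A minimal sketch of that rule (grid size, decay schedule, and data are illustrative, not the paper's configuration):

```python
import numpy as np

rng = np.random.default_rng(0)

def train_som(data, grid=(4, 4), epochs=20, lr0=0.5, sigma0=1.5):
    """Minimal self-organizing map trained with the classic online rule."""
    coords = np.array([(i, j) for i in range(grid[0])
                       for j in range(grid[1])], dtype=float)
    w = rng.normal(size=(len(coords), data.shape[1]))
    for epoch in range(epochs):
        lr = lr0 * (1.0 - epoch / epochs)                  # decaying step size
        sigma = max(sigma0 * (1.0 - epoch / epochs), 0.3)  # shrinking radius
        for x in data:
            bmu = np.argmin(((w - x) ** 2).sum(axis=1))
            d2 = ((coords - coords[bmu]) ** 2).sum(axis=1)
            neigh = np.exp(-d2 / (2.0 * sigma ** 2))       # Gaussian neighbourhood
            w += lr * neigh[:, None] * (x - w)
    return w

def best_matching_unit(w, x):
    """Index of the map node closest to sample x."""
    return int(np.argmin(((w - np.asarray(x)) ** 2).sum(axis=1)))
```

    After training, hospitals mapping to nearby units form a cluster; running the subsequent DEA or MLP-ANN analysis per cluster is what separates genuine inefficiency from heterogeneity.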

    Orientifolds of Gepner Models

    We systematically construct and study Type II orientifolds based on Gepner models that have N=1 supersymmetry in 3+1 dimensions. We classify the parity symmetries and construct the crosscap states. We write down the conditions that a configuration of rational branes must satisfy for consistency (tadpole cancellation and rank constraints) and spacetime supersymmetry. For certain cases, including Type IIB orientifolds of the quintic and of a two-parameter model, one can find all solutions in this class. Depending on the parity, the number of vacua can be large, of the order of 10^10-10^13. For other models it is hard to find all solutions, but special solutions can be found -- some of them are chiral. We also make a comparison with the large-volume regime and obtain a perfect match. Through this study, we find a number of new features of Type II orientifolds, including the structure of moduli space and the change in the type of O-planes under navigation through non-geometric phases. Comment: 142 pages